
    Distributional Properties of Means of Random Probability Measures

    The present paper provides a review of the results concerning distributional properties of means of random probability measures. Our interest in this topic has originated from inferential problems in Bayesian Nonparametrics. Nonetheless, it is worth noting that these random quantities play an important role in seemingly unrelated areas of research. In fact, there is a wealth of contributions both in the statistics and in the probability literature that we try to summarize in a unified framework. Particular attention is devoted to means of the Dirichlet process given the relevance of the Dirichlet process in Bayesian Nonparametrics. We then present a number of recent contributions concerning means of more general random probability measures and highlight connections with the moment problem, combinatorics, special functions, excursions of stochastic processes and statistical physics.
    Keywords: Bayesian Nonparametrics; Completely random measures; Cifarelli–Regazzini identity; Dirichlet process; Functionals of random probability measures; Generalized Stieltjes transform; Neutral to the right processes; Normalized random measures; Posterior distribution; Random means; Random probability measure; Two-parameter Poisson–Dirichlet process.
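    As a hedged illustration of the central object of this review, the sketch below draws Monte Carlo samples from the law of the mean functional of a Dirichlet process, using a truncated Sethuraman stick-breaking representation. The function names, the base measure and the truncation level are illustrative assumptions, not taken from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def dp_mean_samples(theta, base_sampler, n_draws=5000, truncation=500):
        """Monte Carlo draws of the mean functional M = int x P(dx), where
        P ~ DP(theta, P0), via a truncated Sethuraman stick-breaking
        representation P approx sum_k w_k * delta_{X_k}, X_k i.i.d. from P0."""
        means = np.empty(n_draws)
        for i in range(n_draws):
            v = rng.beta(1.0, theta, size=truncation)                   # stick fractions
            w = v * np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))   # stick-breaking weights
            x = base_sampler(truncation)                                # atoms from P0
            means[i] = np.dot(w, x)
        return means

    # Example: base measure P0 = standard normal, total mass theta = 1
    draws = dp_mean_samples(theta=1.0, base_sampler=lambda n: rng.standard_normal(n))
    print(draws.mean(), draws.std())
    ```

    The empirical distribution of these draws approximates the law of the random mean; exact characterizations of this law, such as those based on the Cifarelli–Regazzini identity, are the subject of the review itself.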

    Models beyond the Dirichlet process

    Bayesian nonparametric inference is a relatively young area of research and it has recently undergone a strong development. Most of its success can be explained by the considerable degree of flexibility it ensures in statistical modelling, if compared to parametric alternatives, and by the emergence of new and efficient simulation techniques that make nonparametric models amenable to concrete use in a number of applied statistical problems. Since its introduction in 1973 by T.S. Ferguson, the Dirichlet process has emerged as a cornerstone in Bayesian nonparametrics. Nonetheless, in some cases of interest for statistical applications the Dirichlet process is not an adequate prior choice and alternative nonparametric models need to be devised. In this paper we provide a review of Bayesian nonparametric models that go beyond the Dirichlet process.

    Posterior analysis for some classes of nonparametric models

    Recently, James [15, 16] has derived important results for various models in Bayesian nonparametric inference. In particular, he defined a spatial version of neutral to the right processes and derived their posterior distribution. Moreover, he obtained the posterior distribution for an intensity or hazard rate modeled as a mixture under a general multiplicative intensity model. His proofs rely on the so-called Bayesian Poisson partition calculus. Here we provide new proofs based on an alternative technique.
    Keywords: Bayesian Nonparametrics; Completely random measure; Hazard rate; Neutral to the right prior; Multiplicative intensity model.

    Bayesian nonparametric estimators derived from conditional Gibbs structures

    We consider discrete nonparametric priors which induce Gibbs-type exchangeable random partitions and investigate their posterior behavior in detail. In particular, we deduce conditional distributions and the corresponding Bayesian nonparametric estimators, which can be readily exploited for predicting various features of additional samples. The results provide useful tools for genomic applications where prediction of future outcomes is required.
    Keywords: Bayesian nonparametric inference; Exchangeable random partitions; Generalized factorial coefficients; Generalized gamma process; Poisson-Dirichlet process; Population genetics.
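    The predictive structure at work here can be illustrated in a simplified form with the two-parameter Poisson-Dirichlet process, a well-known Gibbs-type prior whose sequential predictive rule is standard: a new observation joins an existing cluster of size n_j with probability (n_j - sigma)/(theta + n) and forms a new cluster with probability (theta + k*sigma)/(theta + n). The sketch below simulates, for this special case only, the number of new clusters ("new species") appearing in an additional sample; it is a hedged illustration and not the closed-form estimators based on generalized factorial coefficients derived in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def simulate_new_clusters(counts, m, sigma=0.5, theta=1.0, n_rep=2000):
        """Simulate, under a two-parameter Poisson-Dirichlet (sigma, theta) prior,
        how many new clusters appear among m additional observations, given an
        observed sample summarized by its cluster sizes `counts`, by sampling
        sequentially from the standard Pitman-Yor predictive rule."""
        new_totals = np.empty(n_rep, dtype=int)
        for r in range(n_rep):
            sizes = list(counts)
            n, k, new = sum(sizes), len(sizes), 0
            for _ in range(m):
                probs = np.array([nj - sigma for nj in sizes] + [theta + k * sigma])
                probs /= theta + n
                j = rng.choice(len(probs), p=probs)
                if j == len(sizes):       # a previously unseen cluster
                    sizes.append(1)
                    k += 1
                    new += 1
                else:
                    sizes[j] += 1
                n += 1
            new_totals[r] = new
        return new_totals

    # Observed sample of size 18 split into 10 clusters; predict over m = 50 new draws
    obs = [5, 3, 2, 2, 1, 1, 1, 1, 1, 1]
    draws = simulate_new_clusters(obs, m=50)
    print("predictive mean number of new clusters:", draws.mean())
    ```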

    On the stick-breaking representation of normalized inverse Gaussian priors

    Random probability measures are the main tool for Bayesian nonparametric inference, with their laws acting as prior distributions. Many well-known priors used in practice admit different, though equivalent, representations. In terms of computational convenience, stick-breaking representations stand out. In this paper we focus on the normalized inverse Gaussian process and provide a completely explicit stick-breaking representation for it. This result is of interest both from a theoretical viewpoint and for statistical practice.
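    For context, a stick-breaking representation expresses a discrete random probability measure as sum_k w_k * delta_{X_k}, with w_1 = v_1 and w_k = v_k * prod_{j<k} (1 - v_j). The sketch below implements this generic scheme and instantiates it with the familiar Dirichlet process case of independent Beta(1, theta) sticks; the explicit, and more involved, stick distributions obtained in the paper for the normalized inverse Gaussian process are not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def stick_breaking(stick_sampler, atom_sampler, truncation=200):
        """Generic stick-breaking construction: weights w_k = v_k * prod_{j<k}(1 - v_j)
        attached to atoms drawn from the base measure. The stick sampler receives the
        index k, so index-dependent (non-i.i.d.) stick variables are also allowed."""
        v = np.array([stick_sampler(k) for k in range(1, truncation + 1)])
        w = v * np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
        x = atom_sampler(truncation)
        return w, x

    # Dirichlet process special case: independent Beta(1, theta) stick variables.
    theta = 2.0
    w, x = stick_breaking(lambda k: rng.beta(1.0, theta),
                          lambda n: rng.standard_normal(n))

    # Sampling observations from the (truncated) random measure is then immediate.
    obs = rng.choice(x, size=100, p=w / w.sum())
    print("mass captured by the truncation:", w.sum())
    ```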

    Flexible clustering via hidden hierarchical Dirichlet priors

    The Bayesian approach to inference stands out for naturally allowing borrowing of information across heterogeneous populations, with different samples possibly sharing the same distribution. A popular Bayesian nonparametric model for clustering probability distributions is the nested Dirichlet process, which however has the drawback of grouping distributions into a single cluster when ties are observed across samples. With the goal of achieving a flexible and effective clustering method for both samples and observations, we investigate a nonparametric prior that arises as the composition of two different discrete random structures and derive a closed-form expression for the induced distribution of the random partition, the fundamental tool regulating the clustering behavior of the model. On the one hand, this allows us to gain deeper insight into the theoretical properties of the model; on the other hand, it yields an MCMC algorithm for evaluating Bayesian inferences of interest. Moreover, we single out limitations of this algorithm when working with more than two populations and, consequently, devise an alternative, more efficient sampling scheme, which, as a by-product, allows testing homogeneity between different populations. Finally, we perform a comparison with the nested Dirichlet process and provide illustrative examples on both synthetic and real data.